L1-norm Penalized Least Squares with SALSA

Author

  • IVAN SELESNICK
Abstract

This lecture note describes an iterative optimization algorithm, ‘SALSA’, for solving L1-norm penalized least squares problems. We describe the use of SALSA for sparse signal representation and approximation, especially with overcomplete Parseval transforms, and illustrate its use for basis pursuit (BP), basis pursuit denoising (BPD), and morphological component analysis (MCA). SALSA itself was developed by Afonso, Bioucas-Dias, and Figueiredo.
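For concreteness, the L1-norm penalized least squares problem takes the standard form min_x 0.5*||y - A x||_2^2 + lam*||x||_1, and when A is an overcomplete Parseval transform (A A^H = I) the least-squares subproblem inside each SALSA iteration has a simple closed form. The Python sketch below shows an ADMM-style iteration of this kind under that Parseval assumption; it is a minimal illustration rather than the exact pseudocode of the lecture note, and the parameter values (mu, lam) and the random Parseval frame in the usage example are assumptions made for the demo.

```python
import numpy as np

def soft(x, T):
    # Soft-thresholding: the proximal operator of T*||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - T, 0.0)

def salsa_l1(y, A, lam, mu=1.0, n_iter=200):
    # ADMM / SALSA-style iterations for
    #     min_x 0.5*||y - A @ x||_2**2 + lam*||x||_1
    # assuming A is a Parseval frame (A @ A.T = I), so the least-squares
    # subproblem reduces to the closed form below (matrix inversion lemma).
    AH = A.T                    # use A.conj().T for complex transforms
    v = AH @ y                  # splitting variable (coefficient estimate)
    d = np.zeros_like(v)        # scaled dual variable
    for _ in range(n_iter):
        x = soft(v + d, lam / mu)                 # shrinkage (l1 proximal) step
        b = x - d
        v = b + AH @ (y - A @ b) / (1.0 + mu)     # least-squares step (Parseval shortcut)
        d = d - (x - v)                           # dual update
    return x                                      # sparse coefficients; A @ x approximates y

# Toy usage: recover a sparse coefficient vector through a 64x256 Parseval frame.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((256, 256)))
A = Q[:64, :]                                     # rows of an orthogonal matrix: A @ A.T = I
x_true = np.zeros(256)
x_true[rng.choice(256, size=5, replace=False)] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(64)
x_hat = salsa_l1(y, A, lam=0.05)
```

Basis pursuit denoising corresponds to this same objective; the basis pursuit and MCA uses mentioned in the abstract reuse the same soft-thresholding and Parseval least-squares building blocks in different problem setups.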


Similar articles

Reweighted l1-norm Penalized LMS for Sparse Channel Estimation and Its Analysis

A new reweighted l1-norm penalized least mean square (LMS) algorithm for sparse channel estimation is proposed and studied in this paper. Since the standard LMS algorithm does not take into account sparsity information about the channel impulse response (CIR), sparsity-aware modifications of the LMS algorithm aim at outperforming the standard LMS by introducing a penalty term to the standard LM...

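The penalty-term idea summarized above can be made concrete with a generic reweighted zero-attracting LMS update, in which a reweighted l1-type term is added to the standard LMS coefficient update so that small taps are driven toward zero while large taps are left essentially untouched. The sketch below illustrates that general form only and is not necessarily the exact algorithm of the paper; the step-size and penalty parameters (mu, rho, eps) are assumed values.

```python
import numpy as np

def rza_lms(x, d, n_taps, mu=0.01, rho=5e-4, eps=10.0):
    # Generic reweighted zero-attracting LMS sketch: the usual LMS update
    # plus a reweighted l1-type penalty term that pulls small taps toward
    # zero while barely affecting large ones. Parameter values are
    # illustrative, not those of the paper summarized above.
    w = np.zeros(n_taps)                          # channel (filter) estimate
    for n in range(n_taps - 1, len(x)):
        xn = x[n - n_taps + 1:n + 1][::-1]        # current regressor (newest sample first)
        e = d[n] - w @ xn                         # a priori estimation error
        w = w + mu * e * xn - rho * np.sign(w) / (1.0 + eps * np.abs(w))
    return w
```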

Lq Matrix Completion

Rank minimization problems, which consist of finding a matrix of minimum rank subject to linear constraints, arise in many areas of engineering and science. A specific problem is the matrix completion problem, in which a low-rank data matrix is recovered from incomplete samples of its entries by solving a rank-penalized least squares problem. The rank penalty is in fact the l0 norm ...


Sparse channel estimation with lp-norm and reweighted l1-norm penalized least mean squares

The least mean squares (LMS) algorithm is one of the most popular recursive parameter estimation methods. In its standard form it does not take into account any special characteristics that the parameterized model may have. Assuming that such a model is sparse in some domain (for example, it has a sparse impulse or frequency response), we aim at developing LMS algorithms that can adapt to the ...


Norm Penalized Joint-Optimization NLMS Algorithms for Broadband Sparse Adaptive Channel Estimation

A joint-optimization method is proposed for enhancing the behavior of the l1-norm and sum-log norm-penalized NLMS algorithms to meet the requirements of sparse adaptive channel estimation. The improved channel estimation algorithms are realized by using a state-stable model to implement a joint-optimization problem to give a proper trade-off between the convergence and the channel estimation be...


Asymptotic distribution and sparsistency for l1 penalized parametric M-estimators, with applications to linear SVM and logistic regression

Since its early use in least squares regression problems, the l1-penalization framework for variable selection has been employed in conjunction with a wide range of loss functions encompassing regression, classification and survival analysis. While a well-developed theory exists for the l1-penalized least squares estimates, few results concern the behavior of l1-penalized estimates for general ...



Publication date: 2014